
    Multilevel Double Loop Monte Carlo and Stochastic Collocation Methods with Importance Sampling for Bayesian Optimal Experimental Design

    An optimal experimental set-up maximizes the value of data for statistical inferences and predictions. The efficiency of strategies for finding optimal experimental set-ups is particularly important for experiments that are time-consuming or expensive to perform. For instance, when the experiments are modeled by Partial Differential Equations (PDEs), multilevel methods have been proven to dramatically reduce the computational complexity of their single-level counterparts when estimating expected values. For such a setting, we propose two multilevel methods for estimating a popular design criterion known as the expected information gain in simulation-based Bayesian optimal experimental design. The expected information gain criterion is of a nested expectation form, and only a handful of multilevel methods have been proposed for problems of this form. We propose a Multilevel Double Loop Monte Carlo (MLDLMC) method, which combines a multilevel strategy with Double Loop Monte Carlo (DLMC), and a Multilevel Double Loop Stochastic Collocation (MLDLSC) method, which performs high-dimensional integration by deterministic quadrature on sparse grids. For both methods, the Laplace approximation is used for importance sampling, which significantly reduces the computational work of estimating the inner expectations. The optimal values of the method parameters are determined by minimizing the average computational work, subject to satisfying the desired error tolerance. The computational efficiencies of the methods are demonstrated by estimating the expected information gain for Bayesian inference of the fiber orientation in composite laminate materials from an electrical impedance tomography experiment. MLDLSC performs better than MLDLMC when the regularity of the quantity of interest, with respect to the additive noise and the unknown parameters, can be exploited.
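The nested-expectation structure of the expected information gain can be made concrete with a plain single-level DLMC estimator. The sketch below is our own illustration, not the paper's multilevel method: it uses a toy linear-Gaussian model (prior theta ~ N(0, 1), data y = theta + noise) for which the expected information gain has a closed form, and all names and sample sizes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 0.5  # observation noise; toy model y = theta + SIGMA * eps

def log_like(y, theta):
    # Gaussian log-likelihood of y | theta ~ N(theta, SIGMA^2)
    return -0.5 * ((y - theta) / SIGMA) ** 2 - np.log(SIGMA * np.sqrt(2 * np.pi))

def dlmc_eig(n_outer=2000, n_inner=2000):
    # Outer loop: draw (theta, y) pairs from the joint distribution
    theta = rng.standard_normal(n_outer)
    y = theta + SIGMA * rng.standard_normal(n_outer)
    # Inner loop: fresh prior samples estimate the evidence p(y) for each y
    inner = log_like(y[:, None], rng.standard_normal((n_outer, n_inner)))
    m = inner.max(axis=1, keepdims=True)
    log_evidence = np.log(np.mean(np.exp(inner - m), axis=1)) + m[:, 0]
    # EIG = E[ log p(y | theta) - log p(y) ]
    return np.mean(log_like(y, theta) - log_evidence)

eig_exact = 0.5 * np.log(1 + 1 / SIGMA**2)  # closed form for this toy model
```

With the inner sample size fixed, the inner average biases the EIG estimate upward; the multilevel strategies in the paper are precisely about balancing this bias against computational work across levels.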

    Analysis of Discrete L^2 Projection on Polynomial Spaces with Random Evaluations

    We analyze the problem of approximating a multivariate function by discrete least-squares projection on a polynomial space starting from random, noise-free observations. An area of possible application of such technique is uncertainty quantification for computational models. We prove an optimal convergence estimate, up to a logarithmic factor, in the univariate case, when the observation points are sampled in a bounded domain from a probability density function bounded away from zero and bounded from above, provided the number of samples scales quadratically with the dimension of the polynomial space. Optimality is meant in the sense that the weighted L^2 norm of the error committed by the random discrete projection is bounded with high probability from above by the best L^∞ error achievable in the given polynomial space, up to logarithmic factors. Several numerical tests are presented in both the univariate and multivariate cases, confirming our theoretical estimates. The numerical tests also clarify how the convergence rate depends on the number of sampling points, on the polynomial degree, and on the smoothness of the target function.
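A one-dimensional instance of the random discrete least-squares projection is easy to reproduce. The sketch below is our own toy (uniform sampling on [-1, 1], Legendre basis, smooth target) using the quadratic sample-to-dimension scaling discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_discrete_projection(f, degree, n_samples):
    # Noise-free evaluations at points drawn uniformly from [-1, 1]
    x = rng.uniform(-1.0, 1.0, n_samples)
    # Least-squares fit in the Legendre basis, which is stable on [-1, 1]
    coef = np.polynomial.legendre.legfit(x, f(x), degree)
    return np.polynomial.legendre.Legendre(coef)

f = np.cosh                      # smooth target: expect fast convergence
xx = np.linspace(-1.0, 1.0, 2001)
errors = {}
for p in (2, 4, 8):
    m = 2 * (p + 1) ** 2         # samples scale quadratically with dim of space
    proj = random_discrete_projection(f, p, m)
    errors[p] = np.max(np.abs(f(xx) - proj(xx)))
```

For a smooth target such as cosh, the measured sup-norm error drops rapidly with the polynomial degree, mirroring the near-best-approximation behavior the paper proves.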

    Recurrence of Clostridium difficile infection in the Western Australian population

    Clostridium difficile, the most common cause of hospital-associated diarrhoea in developed countries, presents major public health challenges. The high clinical and economic burden of C. difficile infection (CDI) relates to the high frequency of recurrent infections caused by either the same or a different strain of C. difficile. An interval of 8 weeks after the index infection is commonly used to classify recurrent CDI episodes. We assessed strains of C. difficile in a sample of patients with recurrent CDI in Western Australia from October 2011 to July 2017, and investigated how well different intervals between initial and subsequent episodes of CDI distinguish relapse from reinfection. Of 4612 patients with CDI, 1471 (32%) were identified with recurrence. PCR ribotyping data were available for the initial and recurrent episodes of 551 patients. Relapse (recurrence with the same ribotype (RT) as the index episode) was found in 350 (64%) patients and reinfection (recurrence with a new RT) in 201 (36%) patients. Our analysis indicates that 8- and 20-week intervals failed to adequately distinguish reinfection from relapse. In addition, living in a non-metropolitan area modified the effect of age on the risk of relapse. Where molecular epidemiological data are not available, we suggest that applying an 8-week interval to define recurrent CDI requires more consideration.
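The episode definitions used in the abstract amount to a simple rule. The helper below is hypothetical (function name, labels, and the 8-week default window are ours) and merely encodes those definitions.

```python
from datetime import date

def classify_episode(index_rt, index_date, repeat_rt, repeat_date, window_weeks=8):
    """Classify a repeat CDI episode against the index episode.

    Within window_weeks of the index episode a repeat episode counts as a
    recurrence: a relapse if the PCR ribotype (RT) matches the index strain,
    otherwise a reinfection. Beyond the window it is treated as a new episode.
    """
    weeks = (repeat_date - index_date).days / 7
    if weeks > window_weeks:
        return "new episode"
    return "relapse" if repeat_rt == index_rt else "reinfection"
```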

    Respiratory illness in a piggery associated with the first identified outbreak of swine influenza in Australia: Assessing the risk to human health and zoonotic potential

    Australia was previously believed to be free of enzootic swine influenza viruses due to strict quarantine practices and the use of biosecure breeding facilities. The first proven Australian outbreak of swine influenza occurred in Western Australia in 2012, revealing an unrecognized zoonotic risk and a potential future pandemic threat. A public health investigation was undertaken to determine whether zoonotic infections had occurred and to reduce the risk of further transmission between humans and swine. A program of monitoring, testing, treatment, and vaccination was commenced, and a serosurvey of workers was also undertaken. No acute infections with the swine influenza viruses were detected. Serosurvey results were difficult to interpret due to previous influenza infections and past and current vaccinations. However, several workers had elevated haemagglutination inhibition (HI) antibody levels to the swine influenza viruses that could not be attributed to vaccination or infection with contemporaneous seasonal influenza A viruses. As we lacked a suitable control population, this finding was inconclusive. The experience was valuable in developing better protocols for managing outbreaks at the human–animal interface. Strict adherence to biosecurity practices and ongoing monitoring of swine and their human contacts are important to mitigate pandemic risk. Strain-specific serological assays would greatly assist in identifying zoonotic transmission.

    Generalized parallel tempering on Bayesian inverse problems

    Funder: Alexander von Humboldt-Stiftung (doi: http://dx.doi.org/10.13039/100005156)

    In the current work we present two generalizations of the Parallel Tempering algorithm, inspired by the so-called continuous-time Infinite Swapping algorithm. This method, which has its origins in the molecular dynamics community, can be understood as the limit case of the continuous-time Parallel Tempering algorithm in which the (random) time between swaps of states between two parallel chains goes to zero; swapping states between chains thus occurs continuously. Here, we extend this idea to the context of time-discrete Markov chains and present two Markov chain Monte Carlo algorithms that follow the same paradigm as the continuous-time infinite swapping procedure. We analyze the convergence properties of these discrete-time algorithms in terms of their spectral gap, and implement them to sample from different target distributions. Numerical results show that the proposed methods significantly improve over more traditional sampling algorithms such as Random Walk Metropolis and (traditional) Parallel Tempering.
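A minimal discrete-time Parallel Tempering sampler shows the mechanism the paper generalizes. This is our own toy (Random Walk Metropolis moves, adjacent-pair swap proposals, a bimodal one-dimensional target); the temperatures and step size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Bimodal target: equal mixture of N(-4, 1) and N(4, 1), up to a constant
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

def parallel_tempering(n_steps=20_000, betas=(1.0, 0.3, 0.1), step=1.0):
    x = np.zeros(len(betas))          # one state per temperature
    cold = np.empty(n_steps)
    for t in range(n_steps):
        # Random Walk Metropolis move within each tempered chain
        for i, b in enumerate(betas):
            prop = x[i] + step * rng.standard_normal()
            if np.log(rng.random()) < b * (log_target(prop) - log_target(x[i])):
                x[i] = prop
        # Propose swapping states between a random adjacent pair of chains
        j = rng.integers(len(betas) - 1)
        log_acc = (betas[j] - betas[j + 1]) * (log_target(x[j + 1]) - log_target(x[j]))
        if np.log(rng.random()) < log_acc:
            x[j], x[j + 1] = x[j + 1], x[j]
        cold[t] = x[0]                # keep only the beta = 1 chain
    return cold

samples = parallel_tempering()
```

A plain Random Walk Metropolis chain with this step size rarely crosses the barrier between the two modes; the hot chains cross freely and the swap moves hand those crossings down to the cold chain, which is the effect the infinite swapping limit pushes to its extreme.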

    An Improved Hazard Rate Twisting Approach for the Statistic of the Sum of Subexponential Variates

    In this letter, we present an improved hazard rate twisting technique for estimating the probability that a sum of independent but not necessarily identically distributed subexponential Random Variables (RVs) exceeds a given threshold. Instead of twisting all the components in the summation, we propose to twist only the RVs that have the biggest impact on the right tail of the sum distribution and to keep the other RVs unchanged. A minmax approach is performed to determine the optimal twisting parameter, which leads to an asymptotic optimality criterion. Moreover, we show through selected simulation results that our proposed approach achieves a variance reduction compared to the technique where all the components are twisted.
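The flavor of the partial-twisting idea can be sketched for a sum of independent Weibull variates with shape k < 1, a standard subexponential example. Everything below is our own illustration, not the letter's estimator: only the heaviest-scaled component is hazard rate twisted, and the twisting parameter is a common textbook choice rather than the letter's minmax-optimized one.

```python
import numpy as np

rng = np.random.default_rng(3)

K = 0.5                     # Weibull shape < 1: subexponential right tail
SCALES = (2.0, 1.0, 0.5)    # independent, non-identically distributed summands
GAMMA = 40.0                # threshold for P(X1 + X2 + X3 > GAMMA)

def hrt_partial_is(n=100_000):
    # Hazard rate twisting of only the dominant (largest-scale) component.
    # For a Weibull, Lambda(x) = (x / scale)^K, and the twisted density is
    # again Weibull, with scale inflated by (1 - theta)^(-1/K).
    theta = 1.0 - 1.0 / (GAMMA / SCALES[0]) ** K   # textbook parameter choice
    x0 = SCALES[0] * (rng.exponential(size=n) / (1.0 - theta)) ** (1.0 / K)
    # Likelihood ratio f(x0) / f_theta(x0); bounded above by 1 / (1 - theta)
    lr = np.exp(-theta * (x0 / SCALES[0]) ** K) / (1.0 - theta)
    rest = sum(s * rng.weibull(K, size=n) for s in SCALES[1:])  # untwisted
    return np.mean((x0 + rest > GAMMA) * lr)
```

Because the untwisted components keep their original law, the likelihood ratio stays bounded, which is what keeps the variance of the estimator under control.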

    Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity Over Generalized Fading Channels

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of Equal Gain Combining (EGC) and Maximum Ratio Combining (MRC) receivers. In this case, the problem turns out to be that of computing the Cumulative Distribution Function (CDF) of a sum of independent random variables. Since finding a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for estimation in the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods, as naive MC simulations would require high computational complexity. Along this line, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies to arbitrary fading models, whereas the second one achieves the desired bounded relative error property for the majority of well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property in the particular Log-normal environment. Selected simulation results are finally provided to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.
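For concreteness, the quantity being estimated can be checked against a fading model where the sum CDF is available in closed form: iid Rayleigh with MRC, where the combined SNR is a sum of exponential channel gains and hence Gamma distributed. The sketch below is our own naive MC baseline, not the paper's IS scheme; the parameter values are illustrative.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(4)

L_BR, RHO, C_TH = 4, 1.0, 1.0   # branches, average SNR, capacity threshold

# Naive MC: Rayleigh fading gives exponential channel gains; MRC sums them
gains = rng.exponential(size=(1_000_000, L_BR))
oc_mc = np.mean(np.log2(1.0 + RHO * gains.sum(axis=1)) < C_TH)

# Closed form: the sum of L_BR iid unit-mean exponentials is Erlang(L_BR, 1),
# whose CDF is a finite sum, so the OC can be computed exactly here
x = (2.0 ** C_TH - 1.0) / RHO
oc_exact = 1.0 - exp(-x) * sum(x ** k / factorial(k) for k in range(L_BR))
```

Here the outage probability is about 2%, so one million naive samples suffice; for outage probabilities orders of magnitude smaller, the relative error of naive MC blows up, which is the regime the proposed IS estimators target.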

    On the Efficient Simulation of Outage Probability in a Log-Normal Fading Environment

    The outage probability (OP) of the signal-to-interference-plus-noise ratio (SINR) is an important metric used to evaluate the performance of wireless systems. One difficulty in assessing the OP is that, in realistic scenarios, closed-form expressions cannot be derived. This is, for instance, the case of the Log-normal environment, in which evaluating the OP of the SINR amounts to computing the probability that a sum of correlated Log-normal variates exceeds a given threshold. Since such a probability does not admit a closed-form expression, it has thus far been evaluated by several approximation techniques, whose accuracies are not guaranteed in the region of small OPs. For these regions, simulation techniques based on variance reduction algorithms are a good alternative, being quick and highly accurate for estimating rare event probabilities. This constitutes the major motivation behind this paper. More specifically, we propose a generalized hybrid importance sampling scheme, based on a combination of mean shifting and covariance matrix scaling, to evaluate the OP of the SINR in a Log-normal environment. We further our analysis by providing a detailed study of two particular cases. Finally, the performance of these techniques is assessed both theoretically and through various simulation results.
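The mean-shifting half of such a hybrid scheme can be sketched in a few lines for a correlated Log-normal pair. The shift size, covariance, and threshold below are our own illustrative choices, not the paper's optimized ones.

```python
import numpy as np

rng = np.random.default_rng(5)

MU = np.zeros(2)
COV = np.array([[1.0, 0.5],
                [0.5, 1.0]])        # correlated Gaussian exponents
GAMMA = 50.0                        # P(sum of Log-normals > GAMMA) is small

def mean_shift_is(n=200_000, shift=2.5):
    delta = shift * np.ones(2)      # push both exponents toward the rare event
    z = rng.multivariate_normal(MU + delta, COV, size=n)
    icov = np.linalg.inv(COV)
    # Likelihood ratio dN(MU, COV) / dN(MU + delta, COV) at each sample
    lr = np.exp(-(z - MU) @ icov @ delta + 0.5 * delta @ icov @ delta)
    return np.mean((np.exp(z).sum(axis=1) > GAMMA) * lr)
```

Shifting the Gaussian mean makes exceedances of GAMMA frequent under the sampling distribution, and the likelihood ratio reweights each hit so the estimator remains unbiased for the original probability.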

    An Accurate Sample Rejection Estimator of the Outage Probability With Equal Gain Combining

    We evaluate the outage probability (OP) for L-branch equal gain combining (EGC) receivers operating over fading channels, i.e., equivalently the cumulative distribution function (CDF) of the sum of the L channel envelopes. In general, closed-form expressions of OP values are out of reach. The use of naive Monte Carlo (MC) simulations is not a good alternative, as it requires a large number of samples for small values of the OP. In this paper, we use the concept of importance sampling (IS), which is known to yield accurate estimates using fewer simulation runs. Our proposed IS scheme is based on sample rejection, where the IS density is the truncation of the underlying density over the L-dimensional sphere. It assumes knowledge of the CDF of the sum of the L channel gains in closed form. Such an assumption is not restrictive, since it holds for various challenging fading models. We apply our approach to the cases of independent Rayleigh, correlated Rayleigh, and independent and identically distributed Rice fading models. Next, we extend the approach to the interesting scenario of generalised selection combining receivers combined with EGC under the independent Rayleigh environment. For each case, we prove the desired bounded relative error property. Finally, we validate these theoretical results through selected experiments.
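The containment trick behind a sample-rejection estimator can be reproduced for iid Rayleigh EGC: since (sum R_i)^2 >= sum R_i^2, the outage event {sum R_i <= gamma} sits inside the sphere {sum R_i^2 <= gamma^2}, whose probability is known in closed form because the sum of Rayleigh gains (squared envelopes) is Erlang. The sketch below is our own small-scale illustration with a modest threshold; it conditions on the sphere by brute-force rejection, and the parameters are not from the paper.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(6)

L_BR = 2        # EGC branches, iid Rayleigh envelopes (unit scale)
GAMMA = 1.0     # outage threshold on the sum of channel envelopes

def sample_rejection_op(n_accept=50_000):
    # P(B): the sum of squared Rayleigh envelopes is Erlang(L_BR, mean 2),
    # so P(sum R_i^2 <= GAMMA^2) has a closed form
    x = GAMMA ** 2 / 2.0
    p_b = 1.0 - exp(-x) * sum(x ** k / factorial(k) for k in range(L_BR))
    # Rejection step: keep only draws inside the sphere B, which contains
    # the outage event A
    kept = np.empty((0, L_BR))
    while len(kept) < n_accept:
        r = rng.rayleigh(size=(200_000, L_BR))
        kept = np.vstack([kept, r[(r ** 2).sum(axis=1) <= GAMMA ** 2]])
    kept = kept[:n_accept]
    # Unbiased split: P(A) = P(B) * P(A | B), the latter estimated on the
    # truncated draws
    return p_b * np.mean(kept.sum(axis=1) <= GAMMA)

op_is = sample_rejection_op()
op_naive = np.mean(rng.rayleigh(size=(1_000_000, L_BR)).sum(axis=1) <= GAMMA)
```

Because the sphere is a tight cover of the outage event, the conditional probability P(A | B) stays bounded away from zero as the threshold shrinks, which is the intuition behind the bounded relative error property proved in the paper.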